Bounds on the number of hidden neurons in multilayer perceptrons

Authors

  • Shih-Chi Huang
  • Yih-Fang Huang
Abstract

Fundamental issues concerning the capability of multilayer perceptrons with one hidden layer are investigated. The studies focus on realizations of functions that map a finite subset of E(n) into E(d). Both real-valued and binary-valued functions are considered. In particular, a least upper bound is derived for the number of hidden neurons needed to realize an arbitrary function mapping a finite subset of E(n) into E(d). A nontrivial lower bound is also obtained for realizations of injective functions; this result can be applied in studies of pattern recognition and database retrieval. An upper bound is also given for realizing binary-valued functions that arise in pattern-classification problems.
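The flavor of such bounds can be illustrated with a minimal sketch. The construction below (my own simplification, restricted to one input dimension; it is not the authors' construction) realizes an arbitrary map on N distinct real points exactly, using N - 1 hidden neurons with step activations: each hidden neuron fires past a threshold placed between consecutive inputs, and its output weight is the jump in the target value.

```python
# Sketch, under assumptions stated above: realize an arbitrary map on N
# distinct points of the real line with N - 1 step-activation hidden
# neurons. This is an illustration of the idea behind hidden-neuron
# upper bounds, not the paper's actual construction.

def build_mlp(points):
    """points: list of (x, y) pairs with distinct x. Returns a function
    f that agrees with the map on every given point."""
    pts = sorted(points)
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    # One hidden neuron per gap: threshold midway between consecutive inputs.
    thresholds = [(xs[i] + xs[i + 1]) / 2 for i in range(len(xs) - 1)]
    # Output weight of neuron i is the jump y_{i+1} - y_i; the output
    # bias is the leftmost target value.
    weights = [ys[i + 1] - ys[i] for i in range(len(ys) - 1)]
    bias = ys[0]

    def f(x):
        # Step activation: neuron i contributes its weight iff x >= threshold i.
        return bias + sum(w for w, t in zip(weights, thresholds) if x >= t)

    return f

samples = [(0.0, 2.0), (1.0, -1.0), (2.5, 4.0), (3.0, 4.0)]
f = build_mlp(samples)
print([f(x) for x, _ in samples])  # → [2.0, -1.0, 4.0, 4.0]
```

Here 4 points are realized with 3 hidden neurons; in general N points never require more than N - 1 such units in this one-dimensional setting.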


Related articles

Bounds on the Degree of High Order Binary Perceptrons

High-order perceptrons are often used to reduce the size of neural networks. The complexity of the architecture of a usual multilayer network is then turned into the complexity of the functions performed by each high-order unit, and in particular by the degree of their polynomials. The main result of this paper provides a bound on the degree of the polynomial of a high-order perceptron,...


Estimating the Number of Components in a Mixture of Multilayer Perceptrons

The BIC criterion is widely used by the neural-network community for model-selection tasks, although its convergence properties are not always theoretically established. In this paper we focus on estimating the number of components in a mixture of multilayer perceptrons and proving the convergence of the BIC criterion in this framework. The penalized marginal-likelihood for mixture models and hidd...


Orthogonal least square algorithm applied to the initialization of multi-layer perceptrons

An efficient procedure is proposed for initializing two-layer perceptrons and for determining the optimal number of hidden neurons. This is based on the Orthogonal Least Squares method, which is typical of RBF as well as wavelet networks. Some experiments are discussed, in which the proposed method is coupled with standard backpropagation training and compared with random initialization.


Consistent estimation of the architecture of multilayer perceptrons

We consider regression models involving multilayer perceptrons (MLP) with one hidden layer and Gaussian noise. The estimation of the parameters of the MLP can be done by maximizing the likelihood of the model. In this framework, it is difficult to determine the true number of hidden units using an information criterion, like the Bayesian information criterion (BIC), because the information mat...




Journal:
  • IEEE Transactions on Neural Networks

Volume 2, Issue 1

Pages: -

Publication date: 1991